A burglary monitoring and analysis system applied in Swiss police forces
A modern system for the analysis of serial crime has been developing for many years in several Swiss cantons. Its theoretical foundations are consistent with the academic research carried out over the last 15 years. It relies in particular on the systematic exploitation of physical traces, as well as on a classification system grounded in situational approaches. This instrument has evolved iteratively, constantly seeking a sound balance between the effectiveness of series detection and the simplicity demanded by practical constraints. A certain degree of harmonization has also been achieved through a coding system applied in the regional analysis centres. These methods must be understood within a complex system that involves every member of the organization. The degree of involvement at the different levels of responsibility varies greatly from one police force to another. Moreover, the temptation to orient the methods around particular tools, such as geographic information systems or specific databases, can considerably reduce the effectiveness of the system. Flooded with promises in an aggressive commercial context, managers too often let themselves be seduced by this kind of development, which lacks any real scientific foundation. This approach, oriented towards the means (the technology) rather than towards the end and the method, constitutes a fundamental risk for the durability of these structures: the failure of a purchased tool is then too often equated with the failure of the structure itself. Nevertheless, the robustness of the system developed despite the many obstacles encountered shows that its place and its methods are now better recognized. The development of criminal intelligence at the European level will force Switzerland to reflect more globally on its intelligence system at the central, regional and local levels, but the infrastructures that already exist will make this adaptation possible
Can police crime statistics in Switzerland be conceived within the framework of criminal intelligence?
Swiss police forces are moving towards an approach based more heavily on criminal intelligence, in particular to align themselves with developments in the European security system. At the same time, the new police crime statistics require a considerable commitment of resources to adapt the computerized systems and to enter the data. One may argue that these statistics will be useful to the police only if they can be integrated into a coherent intelligence model. Comparing the intelligence cycle with the production phases of the police statistics shows the potential of statistical data in this framework. This approach also highlights a series of fundamental difficulties that the very ambitious new system will not be able to overcome. Finally, there is an urgent need in Switzerland to define and implement a truly coherent architecture that ensures the smooth flow of information at all geographic levels, in order to foster the practice of criminal intelligence and to integrate the exploitation of police statistics into it
A resource-based analysis of the Gonzaga University men’s basketball program
In recent years, the Resource-Based View (RBV) of firms has been applied to strategic management in the context of sport. Numerous studies have examined the relationship between human resource management and athletic performance, as well as success in sport sponsorship. Other studies employed the RBV in the context of professional sports franchises and a major NCAA athletic program. This paper builds on previous research by using the RBV to show how the effective management of a strategic resource has led to a sustained competitive advantage for the Gonzaga University men’s basketball program. The key resource of the Gonzaga program is identified and evaluated in the context of the RBV, and the strategic decisions made to manage that resource and ultimately create a sustainable competitive advantage are discussed
The Hubble Constant determined through an inverse distance ladder including quasar time delays and Type Ia supernovae
Context. The precise determination of the present-day expansion rate of the Universe, expressed through the Hubble constant H0, is one of the most pressing challenges in modern cosmology. Assuming flat ΛCDM, H0 inference at high redshift using cosmic-microwave-background data from Planck disagrees at the 4.4σ level with measurements based on the local distance ladder made up of parallaxes, Cepheids and Type Ia supernovae (SNe Ia), often referred to as the "Hubble tension". Independent, cosmological-model-insensitive ways to infer H0 are of critical importance.
Aims. We apply an inverse-distance-ladder approach, combining strong-lensing time-delay-distance measurements with SN Ia data. By themselves, SNe Ia are merely good relative distance indicators, but by anchoring them to strong gravitational lenses one can obtain an H0 measurement that is relatively insensitive to other cosmological parameters.
Methods. A cosmological parameter estimate is performed for different cosmological background models, both for strong-lensing data alone and for the combined lensing + SNe Ia data sets.
Results. The cosmological-model dependence of strong-lensing H0 measurements is significantly mitigated through the inverse distance ladder. In combination with SN Ia data, the inferred H0 consistently lies around 73-74 km s⁻¹ Mpc⁻¹, regardless of the assumed cosmological background model. Our results agree nicely with those from the local distance ladder, but there is a >2σ tension with Planck results, and a ~1.5σ discrepancy with results from an inverse distance ladder including Planck, Baryon Acoustic Oscillations and SNe Ia. Future strong-lensing distance measurements will reduce the uncertainties in H0 from our inverse distance ladder.
Comment: 5 pages, 3 figures, A&A Letters accepted version
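The anchoring step can be illustrated with a short numerical sketch: a lensing time-delay distance D_dt = (1 + z_d) D_d D_s / D_ds scales inversely with H0, so a single measured D_dt fixes the absolute scale onto which the relative SN Ia distances are tied. The sketch below is not the paper's analysis; the redshifts, the D_dt value, and the fixed matter density are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): how a single time-delay distance
# anchors H0 for a fixed background shape. The lens/source redshifts and the
# "measured" D_dt value are illustrative placeholders.
from scipy.optimize import brentq
from astropy.cosmology import FlatLambdaCDM

z_lens, z_src = 0.6, 1.8          # hypothetical lens and source redshifts
D_dt_measured = 4000.0            # hypothetical time-delay distance [Mpc]

def time_delay_distance(H0, Om0=0.3):
    """D_dt = (1 + z_d) * D_d * D_s / D_ds in a flat LCDM background."""
    cosmo = FlatLambdaCDM(H0=H0, Om0=Om0)
    D_d = cosmo.angular_diameter_distance(z_lens).value
    D_s = cosmo.angular_diameter_distance(z_src).value
    D_ds = cosmo.angular_diameter_distance_z1z2(z_lens, z_src).value
    return (1.0 + z_lens) * D_d * D_s / D_ds

# D_dt scales as 1/H0, so inverting the measurement fixes the absolute scale
# to which the relative SN Ia distances can then be tied.
H0_fit = brentq(lambda H0: time_delay_distance(H0) - D_dt_measured, 50.0, 100.0)
print(f"H0 consistent with the assumed D_dt: {H0_fit:.1f} km/s/Mpc")
```

In the actual inverse distance ladder, the SN Ia sample constrains the shape of the expansion history, which is why the inferred H0 depends only weakly on the assumed background model.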
Targeting NaPi2b in ovarian cancer.
Novel biomarkers are needed to direct new treatments for ovarian cancer, a disease for which the standard of care remains heavily focused on platinum-based chemotherapy. Despite the success of PARP inhibitors, treatment options are limited, particularly in the platinum-resistant setting. NaPi2b is a cell surface sodium-dependent phosphate transporter that regulates phosphate homeostasis under normal physiological conditions and is a lineage marker that is expressed in select cancers, including ovarian, lung, thyroid, and breast cancers, with limited expression in normal tissues. Based on its increased expression in ovarian tumors, NaPi2b is a promising candidate to be studied as a biomarker for treatment and patient selection in ovarian cancer. In preclinical studies, the use of antibodies against NaPi2b showed that this protein can be exploited for tumor mapping and therapeutic targeting. Emerging data from phase 1 and 2 clinical trials in ovarian cancer have suggested that NaPi2b can be successfully detected in patient biopsy samples using immunohistochemistry, and the NaPi2b-targeting antibody-drug conjugate under evaluation appeared to elicit therapeutic responses. The aim of this review is to examine literature supporting NaPi2b as a novel biomarker for potential treatment and patient selection in ovarian cancer and to discuss the critical next steps and future analyses necessary to drive the study of this biomarker and therapeutic targeting forward
Pilot-Plant Development of a Rover Waste Calcination Flowsheet
Results of eight runs in which simulated first-cycle Rover waste was calcined, six using the 10-cm-diameter and two using the 30-cm-diameter pilot-plant calciners, are described. The tests showed that a feed blend consisting of one volume of simulated first-cycle Rover waste and one or two volumes of simulated first-cycle zirconium waste could not be successfully calcined. 5 figs., 8 tables
The Hubble Constant from Strongly Lensed Supernovae with Standardizable Magnifications
The dominant uncertainty in the current measurement of the Hubble constant (H0) with strong gravitational lensing time delays is attributed to uncertainties in the mass profiles of the main deflector galaxies. Strongly lensed supernovae (glSNe) can provide, in addition to measurable time delays, lensing magnification constraints when knowledge about the unlensed apparent brightness of the explosion is imposed. We present a hierarchical Bayesian framework to combine a data set of SNe that are not strongly lensed and a data set of strongly lensed SNe with measured time delays. We jointly constrain (i) H0 using the time delays as an absolute distance indicator, (ii) the lens model profiles using the magnification ratio of lensed and unlensed fluxes on the population level, and (iii) the unlensed apparent magnitude distribution of the SN population and the redshift–luminosity relation of the relative expansion history of the universe. We apply our joint inference framework on a future expected data set of glSNe and forecast that a sample of 144 glSNe of Type Ia with well-measured time series and imaging data will measure H0 to 1.5%. We discuss strategies to mitigate systematics associated with using absolute flux measurements of glSNe to constrain the mass density profiles. Using the magnification of SN images is a promising and complementary alternative to using stellar kinematics. Future surveys, such as the Rubin and Roman observatories, will be able to discover the necessary number of glSNe, and with additional follow-up observations, this methodology will provide precise constraints on mass profiles and H0.
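The magnification side of the framework can be sketched numerically: because a standardized SN Ia has a known unlensed apparent magnitude distribution, the observed brightness of each lensed image translates directly into a magnification constraint on the lens model. The sketch below only illustrates that conversion, not the hierarchical inference itself; the magnitudes and uncertainties are hypothetical.

```python
# Minimal sketch (not the paper's hierarchical framework): turning a
# standardized SN Ia magnitude and an observed image magnitude into a
# magnification constraint. Numbers are illustrative placeholders.
import numpy as np

def magnification_from_mags(m_observed, m_unlensed):
    """mu = F_obs / F_unlensed = 10**(-0.4 * (m_obs - m_unlensed))."""
    return 10.0 ** (-0.4 * (m_observed - m_unlensed))

# Hypothetical lensed-image magnitude and the magnitude the SN Ia would have
# had without lensing (from the standardized population at that redshift).
m_obs, m_obs_err = 22.1, 0.05
m_unl, m_unl_err = 24.0, 0.12          # includes intrinsic SN Ia scatter

mu = magnification_from_mags(m_obs, m_unl)
# First-order propagation of the magnitude errors to the magnification.
mu_err = mu * 0.4 * np.log(10.0) * np.hypot(m_obs_err, m_unl_err)
print(f"magnification constraint: mu = {mu:.2f} +/- {mu_err:.2f}")
```

In the hierarchical framework, such per-image constraints are combined at the population level, together with the time delays, to break the degeneracy between the mass profile and H0.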
TDCOSMO XI. Automated Modeling of 9 Strongly Lensed Quasars and Comparison Between Lens Modeling Software
To use strong gravitational lenses as an astrophysical or cosmological probe, models of their mass distributions are often needed. We present a new, time-efficient automation code for uniform modeling of strongly lensed quasars with GLEE, a lens modeling software, for high-resolution multi-band data. By using the observed positions of the lensed quasars and the spatially extended surface brightness distribution of the lensed quasar host galaxy, we obtain a model of the mass distribution of the lens galaxy. We apply this uniform modeling pipeline to a sample of nine strongly lensed quasars with HST WFC3 images. In most cases, the models show well-reconstructed light components and good alignment between the mass and light centroids. We find that the automated modeling code significantly reduces both the user input time during the modeling process and the time needed to prepare the required input files. This automated modeling pipeline can efficiently produce uniform models of extensive lens-system samples, which can be used for further cosmological analysis. A blind test against the results of an independent automated modeling pipeline based on the modeling software Lenstronomy reveals important lessons: quantities such as the Einstein radius, astrometry, mass flattening, and position angle are generally robustly determined, whereas other quantities depend crucially on the quality of the data and the accuracy of the PSF reconstruction. Better data and/or more detailed analysis will be necessary to elevate our automated models to cosmography grade. Nevertheless, our pipeline enables the quick selection of lenses for follow-up monitoring and further modeling, significantly speeding up the construction of cosmography-grade models. This is an important step forward to take advantage of the orders-of-magnitude increase in the number of lenses expected in the coming decade.
Comment: 36 pages, 13 figures, submitted to A&A
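As a rough illustration of one of the robustly determined quantities, the sketch below computes the Einstein radius of a simple singular isothermal sphere lens. This is not the GLEE or Lenstronomy modeling procedure, and the redshifts, velocity dispersion, and cosmology are illustrative assumptions.

```python
# Minimal sketch (not the GLEE or Lenstronomy pipelines): Einstein radius of a
# singular isothermal sphere, one of the quantities the comparison finds to be
# robustly determined. All input values are hypothetical placeholders.
import numpy as np
from astropy import constants as const
from astropy import units as u
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70.0, Om0=0.3)
z_lens, z_src = 0.5, 2.0                      # hypothetical redshifts
sigma_v = 250.0 * u.km / u.s                  # hypothetical velocity dispersion

D_s = cosmo.angular_diameter_distance(z_src)
D_ds = cosmo.angular_diameter_distance_z1z2(z_lens, z_src)

# theta_E = 4 * pi * (sigma_v / c)**2 * D_ds / D_s   (SIS lens, in radians)
theta_E = (4.0 * np.pi * (sigma_v / const.c) ** 2 * D_ds / D_s).decompose() * u.rad
print(f"Einstein radius: {theta_E.to(u.arcsec):.2f}")
```

Automated pipelines infer the Einstein radius from the image positions and the extended arcs rather than from kinematics, but the roughly one-arcsecond scale is representative of galaxy-scale lensed quasars.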